Sbert Chinese General V2 Distill
This is a Chinese sentence embedding model for general-purpose semantic matching. Through knowledge distillation it has been compressed from a 12-layer BERT into a 4-layer model, which significantly speeds up inference while preserving most of the original performance.
Text Embedding
Transformers
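
Below is a minimal usage sketch with the `sentence-transformers` library, showing how a distilled sentence embedding model like this one is typically used for semantic matching. The repository identifier `DMetaSoul/sbert-chinese-general-v2-distill` is an assumption based on the model name; substitute the actual model path.

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.util import cos_sim

# Assumed model identifier; replace with the actual repository path if it differs.
model = SentenceTransformer("DMetaSoul/sbert-chinese-general-v2-distill")

# Example Chinese sentence pair for semantic matching
# ("How do I apply for a bank card" vs. "How to get a bank card issued")
sentences = ["如何办理银行卡", "怎么申请开银行卡"]

# Encode both sentences into fixed-size embeddings (normalized for cosine similarity)
embeddings = model.encode(sentences, normalize_embeddings=True)

# Higher cosine similarity indicates closer semantic meaning
print(cos_sim(embeddings[0], embeddings[1]))
```

Because the distilled model has only 4 transformer layers, the `encode` call above runs noticeably faster than with the original 12-layer teacher, which is the main benefit in latency-sensitive matching scenarios.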